Kikuchi-Bayes: Factorized Models for Approximate Classification in Closed Form

Authors

  • Aleks Jakulin
  • Ivan Bratko
  • Irina Rish
Abstract

We propose a simple family of classification models, based on the Kikuchi approximation to free energy, that generalize the naive Bayesian classifier. The resulting product of potentials is not normalized, but for classification it is easy to perform the normalization for each instance separately, just as in naive Bayes. Our learning algorithm builds the set of initial regions by including only those regions that yield a significant improvement over the approximation that does not include them. We observe that this algorithm outperforms other methods, such as the tree-augmented naive Bayes, but that the inclusion of regions may increase the approximation error. For that reason we recommend separating the generalization error, which arises from the mismatch between the training and the test data, from the approximation error, which arises from an imperfect model. The approximation error was the dominant source of variation in the experiments we performed on realistic data sets.
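For concreteness, the per-instance normalization described above can be sketched in the usual region-based (Kikuchi) notation; the symbols below (regions R, counting numbers c_R, class y, attribute vector x) are illustrative and not taken verbatim from the paper:

\[
\hat{P}(y \mid \mathbf{x}) \;\approx\; \frac{\prod_{R} \hat{P}(\mathbf{x}_R, y)^{c_R}}{\sum_{y'} \prod_{R} \hat{P}(\mathbf{x}_R, y')^{c_R}}
\]

The numerator is the unnormalized product of region potentials (estimated marginals raised to their counting numbers), and the denominator sums over the class values only, so the normalization is carried out separately for each instance, just as in naive Bayes; naive Bayes itself corresponds to regions that each pair a single attribute with the class, with the class-only region's counting number correcting for the overcounting of the class marginal.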


Similar articles

A Comparison of Variational Bayes and Markov Chain Monte Carlo Methods for Topic Models

Latent Dirichlet Allocation (LDA) is a Bayesian hierarchical topic model that has been widely used for discovering topics in large collections of unstructured text documents. The goal of inference in LDA is to estimate the posterior distribution of topics as well as the topic proportions for each document. Since exact inference is analytically intractable for LDA, we need to use approximate inference a...

A New Approach for Text Documents Classification with Invasive Weed Optimization and Naive Bayes Classifier

With the rapid increase in the number of documents, using Text Document Classification (TDC) methods has become a crucial matter. This paper presents a hybrid model of Invasive Weed Optimization (IWO) and the Naive Bayes (NB) classifier (IWO-NB) for Feature Selection (FS), in order to reduce the large size of the feature space in TDC. TDC includes different actions such as text processing, feature extraction, form...

Approximate Closed-form Formulae for Buckling Analysis of Rectangular Tubes under Torsion

The buckling torque may be much less than the yield torque in very thin rectangular tubes under torsion. In this paper, simple closed-form formulae are presented for buckling analysis of long hollow rectangular tubes under torsion. By the presented formulae, one can obtain the critical torque or the critical angle of twist of the tube in terms of its geometrical parameters and material constant...

Expectation Backpropagation: Parameter-Free Training of Multilayer Neural Networks with Continuous or Discrete Weights

Multilayer Neural Networks (MNNs) are commonly trained using gradient descent-based methods, such as BackPropagation (BP). Inference in probabilistic graphical models is often done using variational Bayes methods, such as Expectation Propagation (EP). We show how an EP based approach can also be used to train deterministic MNNs. Specifically, we approximate the posterior of the weights given th...

Integrated Non-Factorized Variational Inference

We present a non-factorized variational method for full posterior inference in Bayesian hierarchical models, with the goal of capturing the posterior variable dependencies via efficient and possibly parallel computation. Our approach unifies the integrated nested Laplace approximation (INLA) under the variational framework. The proposed method is applicable in more challenging scenarios than ty...

Publication date: 2004